Conversation
edge_config_example.py (Outdated)
@@ -0,0 +1,41 @@
"""Example of constructing an edge endpoint configuration programmatically."""
This is temporary, I'll get rid of this before merging.
Getting rid of it is too easy, grab the most important snippets and put them in docs in a md file, we auto push those to a webpage
brandon-wada
left a comment
There was a problem hiding this comment.
This looks good so far, we should discuss more what the calls to update the config on the remote machine should look like
edge_config_example.py (Outdated)
    RootEdgeConfig,
)


__all__ = [
Nit: this is a little redundant, since it includes all objects that could be imported.
ChatGPT says: "good point. `__all__` is kept intentionally to make the public API explicit/stable for this new module (and to control the wildcard-import surface), but it can be removed if repo convention prefers omitting it."
I'm not really sure which way is best...
For now it's fine either way. The worst that can happen if we leave it in is that someone adds something later and gets caught off guard when it doesn't import the way they expect. Leave it, since it's fewer keystrokes now that it's already in.
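For context on the wildcard-import point above, here is a small self-contained sketch (a demo module built in memory, not the actual `groundlight.edge` module) of how `__all__` controls what `from module import *` exposes:

```python
# Sketch: __all__ controls what `from module import *` exposes.
# The demo module is constructed in memory so this is self-contained.
import sys
import textwrap
import types

demo = types.ModuleType("demo_mod")
exec(
    textwrap.dedent(
        """
        __all__ = ["PublicConfig"]  # only this name is exported by `import *`

        class PublicConfig: ...
        class InternalHelper: ...  # importable directly, but hidden from `import *`
        """
    ),
    demo.__dict__,
)
sys.modules["demo_mod"] = demo

namespace = {}
exec("from demo_mod import *", namespace)  # star-import respects __all__
print(sorted(n for n in namespace if not n.startswith("__")))  # ['PublicConfig']
```

Without `__all__`, the star-import would pick up every public top-level name, which is the "redundant but harmless" trade-off being discussed.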
…nto tim/edge-config
…nto tim/edge-config
Resolve conflict in edge config validation by keeping shared helper-based validation logic. Made-with: Cursor
…nto tim/edge-config
Resolve the merge conflict in test_edge_config and preserve current edge config tests. Made-with: Cursor
brandon-wada
left a comment
Config usage isn't beat-me-over-the-head obvious, but that may just be the nature of our existing edge endpoint code. There's nothing here that I think will create a structural problem, so I'll approve, so long as we're open to revisiting as we add the CRUD methods, which is where we start modifying remote state.
def test_edge_endpoint_config_is_not_subclass_of_detectors_config():
    assert not issubclass(EdgeEndpointConfig, DetectorsConfig)
odd test, any particular reason for this one?
Sorry, this was a silly AI test that I swore I removed. Maybe I removed some other version of this test, but overlooked this one. I will remove it.
Detector and inference-config mappings for edge inference.
"""


edge_inference_configs: dict[str, InferenceConfig] = Field(default_factory=dict)
This is just for internal bookkeeping? Users will never instantiate a DetectorsConfig object with populated arguments, but rather should create an empty DetectorsConfig and then use the add_detector method?
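The create-empty-then-`add_detector` flow being asked about can be sketched roughly like this. These are simplified stand-in dataclasses that mirror the names in this PR, not the actual `groundlight.edge` Pydantic models:

```python
# Stand-in sketch of the add_detector flow (mockup types, not the real SDK models).
from dataclasses import dataclass, field

@dataclass(frozen=True)
class InferenceConfig:
    name: str
    always_return_edge_prediction: bool = False

@dataclass
class DetectorsConfig:
    detectors: list = field(default_factory=list)
    edge_inference_configs: dict = field(default_factory=dict)

    def add_detector(self, detector_id: str, inference_config: InferenceConfig) -> None:
        if any(d["detector_id"] == detector_id for d in self.detectors):
            raise ValueError(f"Detector '{detector_id}' already configured.")
        # Key the config by its own name, preserving the key == name invariant.
        self.edge_inference_configs[inference_config.name] = inference_config
        self.detectors.append(
            {"detector_id": detector_id, "edge_inference_config": inference_config.name}
        )

cfg = DetectorsConfig()  # start empty...
cfg.add_detector(        # ...then register detectors one at a time
    "det_abc123",
    InferenceConfig(name="no_cloud", always_return_edge_prediction=True),
)
print(cfg.detectors[0]["edge_inference_config"])  # no_cloud
```

In this reading, yes: `edge_inference_configs` is populated as a side effect of `add_detector()` rather than by the user directly.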
"""
for name, config in self.edge_inference_configs.items():
    if name != config.name:
        raise ValueError(f"Edge inference config key '{name}' must match InferenceConfig.name '{config.name}'.")
Given this validation, why is this stored as a dict to begin with? It could be a list of InferenceConfigs, and you scan the list for the config with the name you want. More expensive? Sure, but the configs should be optimized for readability rather than performance. If you have so many inference configs that you're worried about performance here, there are going to be a lot of other problems first.
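The list-plus-scan alternative suggested above would look something like this (simplified stand-in types, not the actual SDK models):

```python
# Sketch of the reviewer's alternative: store InferenceConfigs in a list and
# scan by name, so the name lives in one place and the key == name invariant
# (and its validator) disappears entirely.
from dataclasses import dataclass

@dataclass(frozen=True)
class InferenceConfig:
    name: str

def find_inference_config(configs: list[InferenceConfig], name: str) -> InferenceConfig:
    # O(n) scan -- fine at config scale, where readability beats performance.
    for config in configs:
        if config.name == name:
            return config
    raise KeyError(f"No inference config named '{name}'.")

configs = [InferenceConfig("default"), InferenceConfig("no_cloud")]
print(find_inference_config(configs, "no_cloud").name)  # no_cloud
```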
Moves the edge endpoint config schema from `edge-endpoint` into the Python SDK so users can programmatically build and validate edge configurations.

What's new

- `groundlight.edge` submodule with Pydantic models ported from `edge-endpoint/app/core/configs.py`:
  - `EdgeEndpointConfig` -- top-level config (includes `global_config`) with ergonomic `add_detector()` support
  - `DetectorsConfig` -- detector-scoped config model for detector-only workflows
  - `InferenceConfig` -- per-detector inference behavior (frozen/immutable)
  - `GlobalConfig`, `DetectorConfig` -- supporting models
- Inference config presets: `DEFAULT`, `EDGE_ANSWERS_WITH_ESCALATION`, `NO_CLOUD`, `DISABLED`
- Validation: `disable_cloud_escalation` requires `always_return_edge_prediction=True`, and each `detectors[*].edge_inference_config` must exist in `edge_inference_configs`

Usage
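As a usage-flavored sketch, here is the escalation validation rule described above, expressed with simplified stand-in types (mockups, not the real `groundlight.edge` Pydantic models):

```python
# Mockup of the validation rule: disable_cloud_escalation requires
# always_return_edge_prediction=True.
from dataclasses import dataclass

@dataclass(frozen=True)
class InferenceConfig:
    name: str
    always_return_edge_prediction: bool = False
    disable_cloud_escalation: bool = False

    def __post_init__(self):
        # Escalation can only be disabled when the edge is always
        # allowed to answer on its own.
        if self.disable_cloud_escalation and not self.always_return_edge_prediction:
            raise ValueError(
                "disable_cloud_escalation requires always_return_edge_prediction=True"
            )

no_cloud = InferenceConfig(
    "no_cloud", always_return_edge_prediction=True, disable_cloud_escalation=True
)

try:
    InferenceConfig("bad", disable_cloud_escalation=True)
except ValueError as err:
    print(f"rejected: {err}")
```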
Design decisions

- `InferenceConfig` is frozen to prevent accidental mutation of shared preset objects.
- `add_detector()` accepts either a detector ID string or a `Detector` object (consistent with existing SDK ergonomics).
- Inference configs are registered through `add_detector()`; no separate registration step is needed.
- `EdgeEndpointConfig` composes `DetectorsConfig` and delegates detector mutations through `add_detector()`.
- `DetectorsConfig.to_payload()` provides explicit detector-scoped wire serialization for future HTTP use.
- Storage shapes match the existing schema: `detectors` as a list, `edge_inference_configs` as a dict keyed by config name.
- `name` on `InferenceConfig` is excluded from serialization because it is represented as the dict key.

Quality and tests
Unit tests in `test/unit/test_edge_config.py`, including:

- `add_detector()` duplicate detector ID rejection
- `edge_inference_config` references to `InferenceConfig` objects
- `InferenceConfig` validation constraints
- `DetectorsConfig.to_payload()` payload shape
- `EdgeEndpointConfig.model_dump()` payload shape

Follow-up work
- Update `edge-endpoint` to import these models from the Python SDK instead of defining them locally.
- `POST /set-edge-config` (full config replace)
- `GET /edge-config` (get current active config)
- `POST /set-edge-detectors` with mode `replace|merge`
- `set_edge_endpoint_config(config, *, timeout_s=...)`
- `get_edge_endpoint_config()`
- `set_edge_endpoint_detectors(detectors, *, mode="replace"|"merge", timeout_s=...)`
- `remove_edge_endpoint_detectors(detector_ids, *, timeout_s=...)`
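To make the planned client surface above concrete, here is a rough in-memory stub that follows the listed signatures. Everything besides those signatures (the class name, the internal dict shape, the default timeout) is illustrative, and the real methods would make HTTP calls rather than mutate local state:

```python
# Illustrative stub of the planned edge-config client methods.
# Signatures follow the follow-up list; behavior is a local stand-in for HTTP.
class StubEdgeClient:
    def __init__(self):
        self._config = {"detectors": {}}

    def set_edge_endpoint_config(self, config, *, timeout_s=10.0):
        # Would POST /set-edge-config (full config replace).
        self._config = config

    def get_edge_endpoint_config(self):
        # Would GET /edge-config (current active config).
        return self._config

    def set_edge_endpoint_detectors(self, detectors, *, mode="replace", timeout_s=10.0):
        # Would POST /set-edge-detectors with mode replace|merge.
        if mode == "replace":
            self._config["detectors"] = dict(detectors)
        elif mode == "merge":
            self._config["detectors"].update(detectors)
        else:
            raise ValueError(f"Unknown mode: {mode}")

    def remove_edge_endpoint_detectors(self, detector_ids, *, timeout_s=10.0):
        for det_id in detector_ids:
            self._config["detectors"].pop(det_id, None)

client = StubEdgeClient()
client.set_edge_endpoint_detectors({"det_1": "default"}, mode="replace")
client.set_edge_endpoint_detectors({"det_2": "no_cloud"}, mode="merge")
client.remove_edge_endpoint_detectors(["det_1"])
print(client.get_edge_endpoint_config()["detectors"])  # {'det_2': 'no_cloud'}
```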